Results 1 - 20 of 45
1.
iScience ; 27(3): 109092, 2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38405611

ABSTRACT

It has been suggested that our brain re-uses body-based computations to localize touch on tools, but the neural implementation of this process remains unclear. Neural oscillations in the alpha and beta frequency bands are known to map touch on the body in external and skin-centered coordinates, respectively. Here, we pinpointed the role of these oscillations during tool-extended sensing by delivering tactile stimuli to either participants' hands or the tips of hand-held rods. To disentangle brain responses related to each coordinate system, we had participants' hands/tool tips crossed or uncrossed at their body midline. We found that midline crossing modulated alpha (but not beta) band activity similarly for hands and tools, also involving a similar network of cortical regions. Our findings strongly suggest that the brain uses similar oscillatory mechanisms for mapping touch on the body and tools, supporting the idea that body-based neural processes are repurposed for tool use.

2.
Autism ; 28(2): 415-432, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37226824

ABSTRACT

LAY ABSTRACT: A vast majority of individuals with autism spectrum disorder experience impairments in motor skills. These impairments are often labelled as a co-occurring developmental coordination disorder, despite the lack of studies comparing the two disorders. Consequently, motor skills rehabilitation programmes in autism are often not disorder-specific but instead consist of standard programmes for developmental coordination disorder. Here, we compared motor performance in three groups of children: a control group, an autism spectrum disorder group and a developmental coordination disorder group. Despite similar levels of motor skills, as evaluated by the standard Movement Assessment Battery for Children, children with autism spectrum disorder and developmental coordination disorder showed specific motor control deficits in a Reach-to-Displace Task. Children with autism spectrum disorder failed to anticipate the object properties, but could correct their movement as well as typically developing children. In contrast, children with developmental coordination disorder were atypically slow, but showed spared anticipation. Our study has important clinical implications, as motor skills rehabilitation is crucial for both populations. Specifically, our findings suggest that individuals with autism spectrum disorder would benefit from therapies aimed at improving anticipation, perhaps by building on their preserved representations and use of sensory information. Conversely, individuals with developmental coordination disorder would benefit from a focus on using sensory information in a timely fashion.


Subjects
Autism Spectrum Disorder, Autistic Disorder, Motor Skills Disorders, Child, Humans, Motor Skills, Movement
3.
Front Neurol ; 14: 1151515, 2023.
Article in English | MEDLINE | ID: mdl-37064179

ABSTRACT

Objectives: Virtual reality (VR) offers an ecological setting and the possibility of altered visual feedback during head movements, both useful for vestibular research and for the treatment of vestibular disorders. There are, however, no data quantifying the vestibulo-ocular reflex (VOR) during the head impulse test (HIT) in VR. The main objective of this study was to assess the feasibility and performance of eye and head movement measurements in healthy subjects in a VR environment during high-velocity horizontal head rotation (VR-HIT) under a normal visual feedback condition. The secondary objective was to establish the feasibility of VR-HIT recordings in the same group of normal subjects under altered visual feedback conditions. Design: Twelve healthy subjects underwent video HIT using both a standard setup (vHIT) and VR-HIT. In VR, eye and head positions were recorded using, respectively, an embedded eye tracker and an infrared motion tracker. Subjects were tested under four conditions, one reproducing normal visual feedback and three simulating an altered gain or direction of visual feedback. During these three altered conditions, the movement of the visual scene relative to the head movement was decreased in amplitude by 50% (half), nullified (freeze), or inverted in direction (inverse). Results: Eye and head motion recording was successful during normal visual feedback as well as during all three altered conditions. There was no significant difference in VOR gain in VR-HIT between the normal, half, freeze and inverse conditions. In the normal condition, VOR gain differed slightly but significantly (by 3%) between VR-HIT and vHIT. Duration and amplitude of head impulses were significantly greater in VR-HIT than in vHIT. In all three altered VR-HIT conditions, covert saccades were present in approximately one out of four trials. 
Conclusion: Our VR setup allowed high quality recording of eye and head data during head impulse test under normal and altered visual feedback conditions. This setup could be used to investigate compensation mechanisms in vestibular hypofunction, to elicit adaptation of VOR in ecological settings or to allow objective evaluation of VR-based vestibular rehabilitation.

4.
Eur Arch Otorhinolaryngol ; 280(8): 3661-3672, 2023 Aug.
Article in English | MEDLINE | ID: mdl-36905419

ABSTRACT

BACKGROUND AND PURPOSE: Use of a unilateral cochlear implant (UCI) is associated with limited spatial hearing skills. Evidence that these abilities can be trained in UCI users remains limited. In this study, we assessed whether a Spatial training based on hand-reaching to sounds performed in virtual reality improves spatial hearing abilities in UCI users. METHODS: Using a crossover randomized clinical trial, we compared the effects of a Spatial training protocol with those of a Non-Spatial control training. We tested 17 UCI users in a head-pointing to sound task and in an audio-visual attention orienting task, before and after each training. The study is registered at clinicaltrials.gov (NCT04183348). RESULTS: During the Spatial VR training, sound localization errors in azimuth decreased. Moreover, when comparing head-pointing to sounds before vs. after training, localization errors decreased more after the Spatial training than after the control training. No training effects emerged in the audio-visual attention orienting task. CONCLUSIONS: Our results showed that sound localization in UCI users improves during a Spatial training, with benefits that extend to a non-trained sound localization task (generalization). These findings have potential for novel rehabilitation procedures in clinical contexts.


Subjects
Cochlear Implantation, Cochlear Implants, Sound Localization, Speech Perception, Humans, Hearing, Cochlear Implantation/methods, Hearing Tests/methods
5.
Ear Hear ; 44(1): 189-198, 2023.
Article in English | MEDLINE | ID: mdl-35982520

ABSTRACT

OBJECTIVES: We assessed if spatial hearing training improves sound localization in bilateral cochlear implant (BCI) users and whether its benefits can generalize to untrained sound localization tasks. DESIGN: In 20 BCI users, we assessed the effects of two training procedures (spatial versus nonspatial control training) on two different tasks performed before and after training (head-pointing to sound and audiovisual attention orienting). In the spatial training, participants identified sound position by reaching toward the sound sources with their hand. In the nonspatial training, comparable reaching movements served to identify sound amplitude modulations. A crossover randomized design allowed comparison of training procedures within the same participants. Spontaneous head movements while listening to the sounds were allowed and tracked to correlate them with localization performance. RESULTS: During spatial training, BCI users reduced their sound localization errors in azimuth and adapted their spontaneous head movements as a function of sound eccentricity. These effects generalized to the head-pointing sound localization task, as revealed by greater reduction of sound localization error in azimuth and more accurate first head-orienting response, as compared to the control nonspatial training. BCI users benefited from auditory spatial cues for orienting visual attention, but the spatial training did not enhance this multisensory attention ability. CONCLUSIONS: Sound localization in BCI users improves with spatial reaching-to-sound training, with benefits to a nontrained sound localization task. These findings pave the way to novel rehabilitation procedures in clinical contexts.


Subjects
Cochlear Implantation, Cochlear Implants, Sound Localization, Humans, Auditory Perception/physiology, Cochlear Implantation/methods, Hearing/physiology, Hearing Tests/methods, Sound Localization/physiology, Cross-Over Studies
6.
Front Hum Neurosci ; 16: 981330, 2022.
Article in English | MEDLINE | ID: mdl-36248682

ABSTRACT

When describing motion along both the horizontal and vertical axes, languages from different families express the elements encoding verticality before those encoding horizontality (e.g., going up right instead of right up). In light of the motor grounding of language, the present study investigated whether the prevalence of verticality in Path expression also governs the trajectory of biological arm movements. Using a 3D virtual-reality setting, we tracked the kinematics of hand pointing movements in five spatial directions, two of which implied the vertical and horizontal vectors equally (i.e., up right +45° and bottom right -45°). Movement onset could be prompted by visual or auditory verbal cues, the latter being canonical in French ("en haut à droite"/up right) or not ("à droite en haut"/right up). In two experiments, analyses of the index-finger kinematics revealed a significant effect of gravity, with earlier acceleration, velocity, and deceleration peaks for upward (+45°) than downward (-45°) movements, irrespective of the instructions. Remarkably, confirming the linguistic observations, we found that vertical kinematic parameters occurred earlier than horizontal ones for upward movements, both for visual and congruent verbal cues. Non-canonical verbal instructions significantly affected this temporal dynamic: for upward movements, the horizontal and vertical components became temporally aligned, while they reversed for downward movements, where the kinematics of the vertical axis was delayed with respect to that of the horizontal one. This temporal dynamic is so deeply anchored that non-canonical verbal instructions allowed horizontality to precede verticality only for movements that do not fight against gravity. Altogether, our findings provide new insights into the embodiment of language by revealing that linguistic path may reflect the organization of biological movements, giving priority to the vertical axis.

7.
PLoS One ; 17(4): e0263509, 2022.
Article in English | MEDLINE | ID: mdl-35421095

ABSTRACT

Localising sounds means having the ability to process auditory cues deriving from the interplay among sound waves, the head and the ears. When auditory cues change because of temporary or permanent hearing loss, sound localization becomes difficult and uncertain. The brain can adapt to altered auditory cues throughout life, and multisensory training can promote the relearning of spatial hearing skills. Here, we studied the training potential of sound-oriented motor behaviour, testing whether a training based on manual actions toward sounds can induce learning effects that generalize to different auditory spatial tasks. We assessed spatial hearing relearning in normal-hearing adults with a plugged ear by using visual virtual reality and body motion tracking. Participants performed two auditory tasks that entail explicit and implicit processing of sound position (head-pointing sound localization and audio-visual attention cueing, respectively), before and after receiving a spatial training session in which they identified sound position by reaching to auditory sources nearby. Using a crossover design, the effects of this spatial training were compared to a control condition involving the same physical stimuli but different task demands (i.e., a non-spatial discrimination of amplitude modulations in the sound). According to our findings, spatial hearing in one-ear-plugged participants improved more after the reaching-to-sound training than in the control condition. Training by reaching also modified head-movement behaviour during listening. Crucially, the improvements observed during training generalized to a different sound localization task, possibly as a consequence of the novel head-movement strategies acquired.


Subjects
Cues, Sound Localization, Acoustic Stimulation, Adaptation, Physiological, Adult, Auditory Perception, Cross-Over Studies, Hearing, Humans
8.
Cereb Cortex ; 32(18): 3896-3916, 2022 09 04.
Article in English | MEDLINE | ID: mdl-34979550

ABSTRACT

Saccadic adaptation (SA) is a cerebellar-dependent learning of motor commands (MC) that aims at preserving saccade accuracy. Since SA alters visual localization during fixation, and even more so across saccades, it could also involve changes of target and/or saccade visuospatial representations, the latter (CDv) resulting from a motor-to-visual transformation (forward dynamics model) of the corollary discharge of the MC. In the present study, we investigated whether, in addition to its established role in the adaptive adjustment of MC, the cerebellum could contribute to the adaptation-associated perceptual changes. Transfer of backward and forward adaptation to spatial perceptual performance (during ocular fixation and trans-saccadically) was assessed in eight cerebellar patients and eight healthy volunteers. In healthy participants, both types of SA altered MC as well as internal representations of the saccade target and of the saccadic eye displacement. In patients, adaptation-related adjustments of MC and adaptation transfer to localization were strongly reduced relative to healthy participants, revealing abnormal adaptation-related changes of target representations and CDv. Importantly, the estimated changes of CDv were totally abolished following the forward session but mainly preserved in the backward session, suggesting that an internal model ensuring trans-saccadic localization could be located in the adaptation-related cerebellar networks or in downstream networks, respectively.


Subjects
Adaptation, Physiological, Saccades, Cerebellum, Humans
9.
J Cogn Neurosci ; 34(4): 675-686, 2022 03 05.
Article in English | MEDLINE | ID: mdl-35061032

ABSTRACT

The sense of touch is not restricted to the body but can also extend to external objects. When we use a handheld tool to contact an object, we feel the touch on the tool and not in the hand holding the tool. The ability to perceive touch on a tool actually extends along its entire surface, allowing the user to localize where it is touched as accurately as they would on their body. Although the neural mechanisms underlying the ability to localize touch on the body have been largely investigated, those allowing touch on a tool to be localized are still unknown. We aimed to fill this gap by recording the electroencephalography signal of participants while they localized tactile stimuli on a handheld rod. We focused on oscillatory activity in the alpha (7-14 Hz) and beta (15-30 Hz) ranges, as they have previously been linked to distinct spatial codes used to localize touch on the body: beta activity reflects the mapping of touch in skin-based coordinates, whereas alpha activity reflects the mapping of touch in external space. We found that alpha activity was solely modulated by the location of tactile stimuli applied to a handheld rod. Source reconstruction suggested that this alpha power modulation was localized in a network of fronto-parietal regions previously implicated in higher-order tactile and spatial processing. These findings are the first to implicate alpha oscillations in tool-extended sensing and suggest an important role for the processing of touch in external space when localizing touch on a tool.


Subjects
Spatial Processing, Touch Perception, Hand, Humans, Parietal Lobe, Space Perception, Touch
10.
Ear Hear ; 43(1): 192-205, 2022.
Article in English | MEDLINE | ID: mdl-34225320

ABSTRACT

OBJECTIVES: The aim of this study was to assess three-dimensional (3D) spatial hearing abilities in reaching space of children and adolescents fitted with bilateral cochlear implants (BCI). The study also investigated the impact of spontaneous head movements on sound localization abilities. DESIGN: BCI children (N = 18, aged between 8 and 17) and age-matched normal-hearing (NH) controls (N = 18) took part in the study. Tests were performed using immersive virtual reality equipment that allowed control over visual information and initial eye position, as well as real-time 3D motion tracking of head and hand position with subcentimeter accuracy. The experiment exploited these technical features to achieve trial-by-trial exact positioning in head-centered coordinates of a single loudspeaker used for real, near-field sound delivery, which was reproducible across trials and participants. Using this novel approach, broadband sounds were delivered at different azimuths within the participants' arm length, in front and back space, at two different distances from their heads. Continuous head-monitoring allowed us to compare two listening conditions: "head immobile" (no head movements allowed) and "head moving" (spontaneous head movements allowed). Sound localization performance was assessed by computing the mean 3D error (i.e. the difference in space between the X-Y-Z position of the loudspeaker and the participant's final hand position used to indicate the localization of the sound's source), as well as the percentage of front-back and left-right confusions in azimuth, and the discriminability between two nearby distances. Several clinical factors (i.e. age at test, interimplant interval, and duration of binaural experience) were also correlated with the mean 3D error. Finally, the Speech Spatial and Qualities of Hearing Scale was administered to BCI participants and their parents. 
RESULTS: Although BCI participants distinguished well between left and right sound sources, near-field spatial hearing remained challenging, particularly under the "head immobile" condition. Without visual priors of the sound position, response accuracy was lower than that of their NH peers, as evidenced by the mean 3D error (BCI: 55 cm, NH: 24 cm, p = 0.008). The BCI group mainly pointed along the interaural axis, corresponding to the position of their CI microphones. This led to frequent front-back confusions (44.6%). Distance discrimination also remained challenging for BCI users, mostly due to the sound compression applied by their processor. Notably, BCI users benefitted from head movements under the "head moving" condition, with a significant decrease of the 3D error when pointing to front targets (p < 0.001). Interimplant interval was correlated with the 3D error (p < 0.001), whereas no correlation with self-assessment of spatial hearing difficulties emerged (p = 0.9). CONCLUSIONS: In reaching space, BCI children and adolescents are able to extract enough auditory cues to discriminate sound side. However, without visual cues or spontaneous head movements during sound emission, their localization abilities are substantially impaired for front-back and distance discrimination. Exploring the environment with head movements was a valuable strategy for improving sound localization for individuals with different clinical backgrounds. These novel findings could open new perspectives to better understand sound localization maturation in BCI children, and more broadly in patients with hearing loss.


Subjects
Cochlear Implantation, Cochlear Implants, Hearing Loss, Sound Localization, Speech Perception, Adolescent, Child, Cochlear Implantation/methods, Head Movements, Hearing, Humans
11.
Iperception ; 12(6): 20416695211058476, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34900214

ABSTRACT

Following superior parietal lobule and intraparietal sulcus (SPL-IPS) damage, optic ataxia patients underestimate the distance of objects in the ataxic visual field, such that they produce hypometric pointing errors. The metrics of these pointing errors relative to visual target eccentricity fit the cortical magnification of central vision. The SPL-IPS would therefore implement an active "peripheral magnification" to match the real metrics of the environment for accurate action. We further hypothesized that this active compensation of the central magnification by the SPL-IPS contributes to actual object size perception in peripheral vision. Three optic ataxia patients and 10 age-matched controls were asked to compare the thickness of two rectangles flashed simultaneously, one in central and one in peripheral vision. The bilateral optic ataxia patient exhibited exaggerated underestimation bias and uncertainty compared to the control group in both visual fields. The two unilateral optic ataxia patients exhibited a pathological asymmetry between visual fields: size perception was impaired in their contralesional peripheral visual field compared to their healthy side. These results demonstrate that the SPL-IPS contributes to accurate size perception in peripheral vision.

12.
Science ; 374(6569): eabe0874, 2021 Nov 12.
Article in English | MEDLINE | ID: mdl-34762470

ABSTRACT

Does tool use share syntactic processes with language? Acting with a tool is thought to add a hierarchical level into the motor plan. In the linguistic domain, syntax is the cognitive function handling interdependent elements. Using functional magnetic resonance imaging, we detected common neurofunctional substrates in the basal ganglia subserving both tool use and syntax in language. The two abilities elicited similar patterns of neural activity, indicating the existence of shared functional resources. Manual actions and verbal working memory did not contribute to this common network. Consistent with the existence of shared neural resources, we observed bidirectional behavioral enhancement of tool use and syntactic skills in language so that training one function improves performance in the other. This reveals supramodal syntactic processes for tool use and language.


Subjects
Basal Ganglia/physiology, Cognition, Language, Learning, Psychomotor Performance, Brain Mapping, Female, Humans, Linguistics, Magnetic Resonance Imaging, Male, Memory, Short-Term, Neural Pathways, Young Adult
13.
Cereb Cortex Commun ; 2(3): tgab054, 2021.
Article in English | MEDLINE | ID: mdl-34604753

ABSTRACT

Anti-saccades are eye movements that require inhibition to stop the automatic saccade to the visual target and to perform instead a saccade in the opposite direction. The inhibitory processes underlying anti-saccades have been primarily associated with frontal cortex areas, given their role in executive control. Impaired anti-saccade performance has also been associated with the parietal cortex, but its role in inhibitory processes remains unclear. Here, we tested the assumption that the dorsal parietal cortex contributes to spatial inhibition of contralateral visual targets. We measured anti-saccade performance in 2 unilateral optic ataxia patients and 15 age-matched controls. Participants performed 90-degree (across and within visual fields) and 180-degree inversion anti-saccades, as well as pro-saccades. The main result was that our patients took longer to inhibit visually guided saccades when the visual target was presented in the ataxic hemifield and the task required a saccade across hemifields. This was observed in anti-saccade latencies and error rates. These deficits show the crucial role of the dorsal posterior parietal cortex in the spatial inhibition of contralateral visual target representations needed to plan an accurate anti-saccade toward the ipsilesional side.

14.
Neuropsychologia ; 161: 108013, 2021 10 15.
Article in English | MEDLINE | ID: mdl-34474063

ABSTRACT

Attentional resources and their distribution are specifically impaired in simultanagnosia, as well as in the visuo-attentional form of developmental dyslexia. Both clinical conditions are conceived as a limitation of simultaneous visual processing after superior parietal lobule (SPL) dysfunction (review in Valdois et al., 2019). However, a reduced space-based attentional window (i.e. a limited visual eccentricity at which a target object can be identified; Khan et al., 2016) has been demonstrated in simultanagnosia, versus a reduced object-based span (i.e. a limited number of objects processed at each fixation; Bosse et al., 2007) in developmental dyslexia. In healthy individuals, the cost in reaction time per item in serial search tasks suggests that a group of objects is processed simultaneously at a time, but this group is undefined and depends on the visual complexity of the task. Healthy individuals and a patient with simultanagnosia performed serial search tasks involving either symbols (made of separable features) or objects made of non-separable features, with distractors that were either all identical or all dissimilar. We used a moving-window paradigm to determine whether the task was performed with a "working space" versus a "working span" limitation, in the control group and in the patient with bilateral SPL damage. We found that healthy individuals performed search in a color task comprising non-separable-feature objects and dissimilar distractors with a limited space-based attentional window; this attentional window, as well as the mean saccade amplitude used to displace it across the visual display, was independent of set size, and thus inconsistent with an object-based attentional span. 
In the symbol task, comprising a feature-absent search in which all feature-present distractors were dissimilar, we observed that mean saccade amplitude decreased with set size and that search performance could not be mimicked by a moving window of a single diameter; instead, participants seemed to process a fixed number of symbols at a time (object-based span). Following bilateral SPL lesions, patient IG demonstrated a similar space-based search process in the color search task, with a normal attentional window. In contrast, her cost per item in the symbol task increased dramatically, demonstrating a clear deficit of simultaneous object perception. These results confirm the specific contribution of the SPL to the visual processing of multiple objects made of separable features (like letters), most dramatically when they are all different, which explains the specific difficulty faced by beginning readers in cases of SPL dysfunction.


Subjects
Parietal Lobe, Saccades, Cognition, Female, Humans, Parietal Lobe/diagnostic imaging, Reaction Time, Visual Perception
15.
Front Psychol ; 11: 510787, 2020.
Article in English | MEDLINE | ID: mdl-33192759

ABSTRACT

Previous research using immersive virtual reality (VR) has shown that after a short period of embodiment by White people in a Black virtual body, their implicit racial bias against Black people diminishes. Here we tested the effects of several socio-cognitive variables that could contribute to enhancing or reducing implicit racial bias. The first aim of the study was to assess the beneficial effects of cooperation within a VR scenario; the second was to provide preliminary testing of the hypothesis that empathy and political attitudes contribute to implicit bias about race; and the third was to explore the relationship between political attitudes and empathy. We had (Caucasian) participants embodied in a Black virtual body engage either in a cooperative (Coop group) or in a non-cooperative (Neutral group) activity with a confederate experimenter embodying another Black avatar. Before and after VR, we measured participants' implicit racial bias by means of the Implicit Association Test (IAT) and their perceived closeness toward the confederate experimenter. Before VR, we also assessed participants' political attitudes and empathy traits. Results revealed that, compared to the Neutral group, the Coop group showed lower IAT scores after the social interaction. Interestingly, in the Neutral but not the Coop group, the perceived closeness toward the confederate experimenter was associated with the initial racial bias: the more participants reduced their distance, the more they reduced their IAT score. Moreover, reported empathy traits and political attitudes significantly explained the variance observed in the initial implicit bias, with perspective-taking, empathic concern, and personal distress being significant predictors of IAT scores. Finally, there was a relationship between political attitudes and empathy: the more participants considered themselves left-wing voters, the higher their perspective-taking and empathic concern scores. 
We discuss these findings within the neuroscientific and social cognition fields and encourage scholars from different domains to further explore whether, and under which conditions, a given manipulation for reducing racial bias can be efficiently transposed to VR.

16.
Neuropsychologia ; 149: 107665, 2020 12.
Article in English | MEDLINE | ID: mdl-33130161

ABSTRACT

When localising sounds in space, the brain relies on internal models that specify the correspondence between the auditory input reaching the ears, initial head position and coordinates in external space. These models can be updated throughout life, setting the basis for re-learning spatial hearing abilities in adulthood. In addition, strategic behavioural adjustments allow people to adapt quickly to atypical listening situations. Until recently, the potential role of dynamic listening, involving head movements or reaching to sounds, has remained largely overlooked. Here, we exploited visual virtual reality (VR) and real-time kinematic tracking to study the role of active multisensory-motor interactions when hearing individuals adapt to altered binaural cues (one ear plugged and muffed). Participants were immersed in a VR scenario showing 17 virtual speakers at ear level. In each trial, they heard a sound delivered from a real speaker aligned with one of the virtual ones and were instructed either to reach and touch the perceived sound source (Reaching group) or to read the label associated with the speaker (Naming group). Participants were free to move their heads during the task and received audio-visual feedback on their performance. Most importantly, they performed the task under binaural or monaural listening. Results show that both groups adapted rapidly to monaural listening, improving sound localisation performance across trials and changing their head-movement behaviour. Reaching to sounds induced faster and larger sound localisation improvements than just naming their positions. This benefit was linked to progressively wider head movements to explore auditory space, selectively in the Reaching group. In conclusion, reaching to sounds in an immersive visual VR context proved most effective for adapting to altered binaural listening. 
Head movements played an important role in adaptation, pointing to the importance of dynamic listening when implementing training protocols for improving spatial hearing.


Subjects
Sound Localization, Virtual Reality, Adaptation, Physiological, Adult, Cues, Hearing, Humans
17.
Sci Rep ; 10(1): 17275, 2020 10 14.
Article in English | MEDLINE | ID: mdl-33057121

ABSTRACT

Following tool use, the kinematics of free-hand movements are altered. This modified kinematic pattern has been taken as a behavioral hallmark of the change that tool use induces in the effector representation. Proprioceptive inputs appear central to updating the estimated effector state. Here we asked whether online proprioception, accessed in real time, or offline, memory-based proprioception is responsible for this update. Since normal aging affects offline proprioception only, we examined a group of 60-year-old adults for proprioceptive acuity and movement kinematics when grasping an object before and after tool use. As a control, participants performed the same movements with a weight, equivalent to the tool's weight, attached to their wrist. Despite hampered offline proprioceptive acuity, the 60-year-old participants exhibited the typical kinematic signature of tool incorporation: namely, the latency of the transport-component peaks was longer and their amplitude reduced after tool use. In contrast, we observed no kinematic modifications in the control condition. In addition, online proprioceptive acuity correlated with tool incorporation, as indexed by the amount of kinematic change observed after tool use. Altogether, these findings point to the prominent role played by online proprioception in updating the body estimate for the motor control of tools.
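The "latency and amplitude of the transport-component peaks" are standard reach-to-grasp measures: the time and magnitude of the wrist's peak tangential velocity. A minimal sketch of how such peaks can be extracted from tracked positions (a generic illustration with a synthetic minimum-jerk-like trajectory, not the study's pipeline):

```python
import numpy as np

def transport_peak(positions, fs):
    """Peak tangential velocity and its latency from a sampled wrist trajectory.

    positions : (n_samples, 3) array of x, y, z coordinates in metres
    fs        : sampling rate in Hz
    Returns (peak_velocity_m_per_s, latency_s).
    """
    # Central-difference velocity per sample, scaled to metres per second
    velocity = np.linalg.norm(np.gradient(positions, axis=0), axis=1) * fs
    i = int(np.argmax(velocity))
    return float(velocity[i]), i / fs

# Synthetic 1 s reach sampled at 100 Hz: 30 cm along x, bell-shaped speed profile
t = np.arange(0.0, 1.0, 0.01)
x = 0.3 * (t - np.sin(2.0 * np.pi * t) / (2.0 * np.pi))
traj = np.column_stack([x, np.zeros_like(t), np.zeros_like(t)])
peak_v, latency = transport_peak(traj, fs=100)
```

For this symmetric profile the peak velocity is about 0.6 m/s at 0.5 s; a longer latency and smaller amplitude of this peak after tool use is the kinematic signature the abstract describes. Real recordings would typically be low-pass filtered before differentiation.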


Subjects
Arm/physiology, Healthy Aging/physiology, Aged, Biomechanical Phenomena, Female, Hand Strength, Humans, Male, Middle Aged, Proprioception, Tool Use Behavior, Wrist/physiology
18.
Curr Biol ; 29(24): 4276-4283.e5, 2019 12 16.
Article in English | MEDLINE | ID: mdl-31813607

ABSTRACT

The extent to which a tool is an extension of its user is a question that has fascinated writers and philosophers for centuries [1]. Despite two decades of research [2-7], it remains unknown how this could be instantiated at the neural level. To this end, the present study combined behavior, electrophysiology and neuronal modeling to characterize how the human brain could treat a tool like an extended sensory "organ." As with the body, participants localize touches on a hand-held tool with near-perfect accuracy [7]. This behavior reflects the ability of the somatosensory system to rapidly and efficiently use the tool as a tactile extension of the body. Using electroencephalography (EEG), we found that where a hand-held tool was touched was immediately coded in the neural dynamics of the primary somatosensory and posterior parietal cortices of healthy participants. We found similar neural responses in a proprioceptively deafferented patient with spared touch perception, suggesting that location information is extracted from the rod's vibrational patterns. Simulations of mechanoreceptor responses [8] suggested that these patterns are processed with high speed and efficiency. A second EEG experiment showed that touches on the tool and arm surfaces were localized by similar stages of cortical processing. Multivariate decoding algorithms and cortical source reconstruction provided further evidence that early limb-based processes were repurposed to map touch on a tool. We propose that an elementary strategy the human brain uses to sense with tools is to recruit primary somatosensory dynamics otherwise devoted to the body.
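Time-resolved multivariate decoding of the kind mentioned above asks, at each post-stimulus time point, whether single-trial EEG topographies discriminate the touched location. A minimal cross-validated sketch using a nearest-class-mean classifier (an illustration of the general technique on synthetic data, not the study's actual pipeline, which may use different classifiers and preprocessing):

```python
import numpy as np

def timeresolved_decode(X, y, n_folds=5, seed=0):
    """Cross-validated decoding accuracy at each time point.

    X : (n_trials, n_channels, n_times) single-trial epochs
    y : (n_trials,) integer class labels (e.g. touch location)
    Returns (n_times,) accuracy, averaged over folds.
    """
    rng = np.random.default_rng(seed)
    n_trials, _, n_times = X.shape
    order = rng.permutation(n_trials)
    folds = np.array_split(order, n_folds)
    acc = np.zeros(n_times)
    for test_idx in folds:
        train_idx = np.setdiff1d(order, test_idx)
        classes = np.unique(y[train_idx])
        for t in range(n_times):
            # Class-mean "templates" from the training trials at this time point
            means = np.stack([X[train_idx][y[train_idx] == c, :, t].mean(0)
                              for c in classes])
            # Assign each test trial to the nearest template
            d = np.linalg.norm(X[test_idx][:, :, t][:, None, :] - means[None],
                               axis=2)
            pred = classes[np.argmin(d, axis=1)]
            acc[t] += np.mean(pred == y[test_idx])
    return acc / n_folds

# Synthetic data: two touch locations become separable from time index 10 on
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 8, 20))
y = np.repeat([0, 1], 20)
X[y == 1, :, 10:] += 2.0
acc = timeresolved_decode(X, y)
```

Accuracy hovers near chance (0.5) before the signal appears and rises well above it afterwards; the latency at which it first exceeds chance is what makes such decoding informative about processing stages.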


Subjects
Somatosensory Cortex/physiology, Touch Perception/physiology, Touch/physiology, Adult, Brain/physiology, Electroencephalography, Female, Humans, Male, Mechanoreceptors/physiology, Middle Aged, Neuronal Plasticity/physiology, Tool Use Behavior/physiology, Visual Perception/physiology
19.
Psychopharmacology (Berl) ; 236(12): 3641-3653, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31384989

ABSTRACT

Elucidating how neuromodulators influence motivated behaviors is a major challenge of neuroscience research. It has been proposed that the locus coeruleus-norepinephrine system promotes behavioral flexibility and provides the resources required to face challenges in a wide range of cognitive processes. Both theoretical and computational models suggest that the locus coeruleus-norepinephrine system tunes neural gain in brain circuits to optimize behavior. However, to the best of our knowledge, empirical evidence demonstrating the role of norepinephrine in performance optimization is scarce. Here, we modulated norepinephrine transmission in monkeys performing a Go/No-Go discrimination task using atomoxetine, a norepinephrine-reuptake inhibitor. We tested the optimization hypothesis by assessing perceptual sensitivity, response bias, and their functional relationship within the framework of signal detection theory. We also manipulated the contingencies of the task (level of stimulus discriminability, target-stimulus frequency, and decision-outcome values) to modulate the relationship between sensitivity and response bias. We found that atomoxetine increased the subjects' perceptual sensitivity to discriminate target stimuli regardless of the task contingency. Atomoxetine also improved the functional relationship between sensitivity and response bias, leading to a closer fit with the optimal strategy in different contexts. In addition, atomoxetine tended to reduce reaction-time variability. Taken together, these findings support a role for norepinephrine transmission in optimizing response strategy.
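The sensitivity and bias measures referred to above are the standard signal-detection quantities d' = z(H) - z(FA) and criterion c = -0.5 [z(H) + z(FA)], where H and FA are hit and false-alarm rates. A minimal sketch of their computation from Go/No-Go counts (the trial counts are invented for illustration; a log-linear correction is one common way to avoid infinite z-scores):

```python
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity (d') and response bias (criterion c).

    Adds 0.5 to each cell (log-linear correction) so that hit or
    false-alarm rates of exactly 0 or 1 do not yield infinite z-scores.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)             # perceptual sensitivity
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias
    return d_prime, criterion

# Hypothetical session from a liberal responder: many hits, many false alarms
d, c = sdt_measures(hits=90, misses=10, false_alarms=30, correct_rejections=70)
```

Here d is well above zero (the stimuli are discriminated) while c is negative (a liberal bias toward responding "Go"); an increase in d without a matched shift of c toward its reward-optimal value is exactly the dissociation the task contingencies were designed to probe.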


Subjects
Adrenergic Uptake Inhibitors/pharmacology, Atomoxetine Hydrochloride/pharmacology, Decision Making/drug effects, Locus Coeruleus/drug effects, Norepinephrine/antagonists & inhibitors, Reaction Time/drug effects, Animals, Cognition/drug effects, Cognition/physiology, Decision Making/physiology, Female, Locus Coeruleus/physiology, Macaca mulatta, Norepinephrine/physiology, Reaction Time/physiology
20.
J Cogn Neurosci ; 31(8): 1141-1154, 2019 08.
Article in English | MEDLINE | ID: mdl-30321094

ABSTRACT

Peripersonal space is a multisensory representation relying on the processing of tactile and visual stimuli presented on and close to different body parts. The most studied peripersonal space representation is perihand space (PHS), a highly plastic representation modulated following tool use and by the rapid approach of visual objects. Given these properties, PHS may serve different sensorimotor functions, including guidance of voluntary actions such as object grasping. Strong support for this hypothesis would derive from evidence that PHS plastic changes occur before the upcoming movement rather than after its initiation, yet to date, such evidence is scant. Here, we tested whether action-dependent modulation of PHS, behaviorally assessed via visuotactile perception, may occur before an overt movement, as early as the action planning phase. To do so, we probed tactile and visuotactile perception at different time points before and during the grasping action. Results showed that visuotactile perception was more strongly affected during the planning phase (250 msec after vision of the target) than during a similarly static but earlier phase (50 msec after vision of the target). Visuotactile interaction was also enhanced at the onset of hand movement, and it further increased during subsequent phases of hand movement. Such a visuotactile interaction featured interference effects during all phases from action planning onward as well as a facilitation effect at movement onset. These findings reveal that planning to grasp an object strengthens the multisensory interaction of visual information from the target and somatosensory information from the hand. Such early updating of the visuotactile interaction reflects multisensory processes supporting motor planning of actions.


Subjects
Personal Space, Psychomotor Performance/physiology, Touch Perception/physiology, Visual Perception/physiology, Adult, Female, Humans, Male, Young Adult